Section: New Results

Analysis of the global convergence of (fast) incremental EM methods

The EM algorithm is one of the most popular algorithms for inference in latent data models. Its original formulation does not scale to large data sets, because the whole data set is required at each iteration. To alleviate this problem, Neal and Hinton (1998) proposed an incremental version of EM (iEM), in which at each iteration the conditional expectation of the latent data (E-step) is updated only for a mini-batch of observations. Another approach was proposed by Cappé and Moulines (2009), in which the E-step is replaced by a stochastic approximation step closely related to stochastic gradient descent. In this study, we analyzed the incremental and stochastic versions of the EM algorithm in a common unifying framework. We also introduced a new incremental version, inspired by the SAGA algorithm of Defazio et al. (2014), and established non-asymptotic bounds for the global convergence of these methods [15].
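
To make the incremental E-step concrete, here is a minimal sketch of iEM for a one-dimensional Gaussian mixture with known unit variances. The toy model, the function names (responsibilities, incremental_em), and the mini-batch schedule are illustrative assumptions only; this is not the general setting, nor the SAGA-type variant, analyzed in [15].

```python
import numpy as np


def responsibilities(x, weights, means, var=1.0):
    """E-step for a batch of 1-D points: posterior p(z_i = k | x_i)."""
    log_dens = -0.5 * (x[:, None] - means[None, :]) ** 2 / var
    log_post = np.log(weights)[None, :] + log_dens
    log_post -= log_post.max(axis=1, keepdims=True)  # numerical stability
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)


def incremental_em(x, K, n_epochs=20, batch_size=32, seed=0):
    """iEM sketch: refresh the E-step statistics one mini-batch at a time."""
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    weights = np.full(K, 1.0 / K)
    means = rng.choice(x, K, replace=False)

    # Per-observation responsibilities and their running sums
    # (the complete-data sufficient statistics of the mixture).
    r = responsibilities(x, weights, means)    # shape (n, K)
    S0 = r.sum(axis=0)                         # sum_i r_ik
    S1 = (r * x[:, None]).sum(axis=0)          # sum_i r_ik * x_i

    n_batches = max(1, n // batch_size)
    for _ in range(n_epochs):
        for idx in np.array_split(rng.permutation(n), n_batches):
            # Incremental E-step: recompute the statistics of this
            # mini-batch only, swapping out its old contribution.
            r_new = responsibilities(x[idx], weights, means)
            S0 += r_new.sum(axis=0) - r[idx].sum(axis=0)
            S1 += (r_new * x[idx][:, None]).sum(axis=0) \
                - (r[idx] * x[idx][:, None]).sum(axis=0)
            r[idx] = r_new
            # M-step on the full running statistics.
            weights = S0 / n
            means = S1 / S0
    return weights, means
```

On a synthetic two-component sample, e.g. x drawn half from N(-2, 1) and half from N(2, 1), the estimated means should settle near -2 and 2; the key point of the sketch is that each iteration touches only one mini-batch while the M-step always uses the full running statistics.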